Comparison of Methods for Initializing EM Algorithm for Estimation of Parameters of Gaussian, Multi-component, Heteroscedastic Mixture Models
Authors
Abstract
A basic approach to estimating mixture model parameters is the expectation-maximization (EM) algorithm, which maximizes the likelihood function. However, it is essential to provide the algorithm with proper initial conditions, as it is highly dependent on the first estimate ("guess") of the mixture parameters. This paper presents several initial-condition estimation methods that may be used as the first step of the EM parameter estimation procedure, and compares these initialization methods for heteroscedastic, multi-component Gaussian mixtures.
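As a concrete illustration of what such a comparison involves (a minimal sketch, not the paper's experimental protocol), the snippet below simulates a univariate, heteroscedastic three-component Gaussian mixture and contrasts two common EM initializations, random starts versus k-means, by the final log-likelihood each reaches. Scikit-learn's GaussianMixture and the chosen parameter values are assumptions made purely for illustration.

```python
# A minimal sketch, assuming scikit-learn; not the paper's exact protocol.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Simulate a univariate heteroscedastic mixture: distinct means AND variances.
weights = np.array([0.5, 0.3, 0.2])
means = np.array([-3.0, 0.0, 4.0])
sigmas = np.array([0.5, 1.0, 2.0])
comp = rng.choice(3, size=2000, p=weights)
x = rng.normal(means[comp], sigmas[comp]).reshape(-1, 1)

# Compare two common initializations by the log-likelihood EM finally attains.
for init in ("random", "kmeans"):
    gm = GaussianMixture(n_components=3, covariance_type="full",
                         init_params=init, n_init=5, random_state=0)
    gm.fit(x)
    print(f"{init:>7s} init: mean log-likelihood = {gm.score(x):.4f}, "
          f"estimated means = {np.sort(gm.means_.ravel()).round(2)}")
```

Comparing the attained log-likelihood (or the recovered component means) across initializations is one simple way to judge how much an initialization method helps EM.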
Similar Resources
Initializing EM algorithm for univariate Gaussian, multi-component, heteroscedastic mixture models by dynamic programming partitions
In this paper we present and evaluate a methodology for estimating initial values of the parameters of univariate, heteroscedastic Gaussian mixtures, on the basis of a dynamic programming algorithm for partitioning the range of observations into bins. We evaluate variants of the dynamic programming method corresponding to different scoring functions for partitioning. For both simulated and real data-se...
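Purely as an illustrative sketch of this kind of initialization (not the authors' exact algorithm or their scoring functions), the code below partitions sorted one-dimensional data into k contiguous bins by dynamic programming with a within-bin sum-of-squares score and turns the bins into initial weights, means, and variances for EM. The function names and the SSE score are assumptions.

```python
# An illustrative sketch only; the bin score (within-bin SSE) and function
# names are assumptions, not the scoring functions evaluated in the paper.
import numpy as np

def dp_partition(xs, k):
    """Optimal partition of sorted 1-D data into k contiguous bins,
    minimizing the total within-bin sum of squared deviations."""
    xs = np.sort(np.asarray(xs, dtype=float))
    n = len(xs)
    s1 = np.concatenate(([0.0], np.cumsum(xs)))        # prefix sums
    s2 = np.concatenate(([0.0], np.cumsum(xs ** 2)))   # prefix sums of squares

    def sse(i, j):                                      # SSE of xs[i:j]
        m = j - i
        return s2[j] - s2[i] - (s1[j] - s1[i]) ** 2 / m

    best = np.full((k + 1, n + 1), np.inf)  # best[b, j]: cost of first j points in b bins
    back = np.zeros((k + 1, n + 1), dtype=int)
    best[0, 0] = 0.0
    for b in range(1, k + 1):
        for j in range(b, n + 1):
            for i in range(b - 1, j):
                c = best[b - 1, i] + sse(i, j)
                if c < best[b, j]:
                    best[b, j], back[b, j] = c, i
    cuts, j = [n], n                                    # recover bin boundaries
    for b in range(k, 0, -1):
        j = back[b, j]
        cuts.append(j)
    return xs, cuts[::-1]

def initial_parameters(xs, cuts):
    """Turn the bins into initial weights, means and variances for EM."""
    w, mu, var = [], [], []
    for i, j in zip(cuts[:-1], cuts[1:]):
        seg = xs[i:j]
        w.append(len(seg) / len(xs))
        mu.append(seg.mean())
        var.append(seg.var() + 1e-6)    # variance floor avoids degenerate bins
    return np.array(w), np.array(mu), np.array(var)

# tiny usage example on simulated data
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0, 0.5, 300), rng.normal(5, 2.0, 200)])
xs, cuts = dp_partition(data, k=2)
print(initial_parameters(xs, cuts))
```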
Image Segmentation using Gaussian Mixture Model
Abstract: Stochastic models such as mixture models, graphical models, Markov random fields and hidden Markov models have a key role in probabilistic data analysis. In this paper, we fitted a Gaussian mixture model to the pixels of an image. The parameters of the model were estimated by the EM algorithm. In addition, the label corresponding to each pixel of the true image was assigned by Bayes' rule. In fact,...
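A minimal sketch of the idea described in this abstract, assuming scikit-learn's GaussianMixture and a synthetic grayscale image (not the authors' data or exact procedure): fit the mixture to pixel intensities with EM and assign each pixel to the component with maximal posterior probability, i.e. the Bayes-rule labeling.

```python
# A minimal sketch, assuming scikit-learn and a synthetic grayscale image;
# not the authors' data or exact procedure.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Synthetic "image": two regions with different intensity statistics.
img = np.vstack([rng.normal(0.3, 0.05, (32, 64)),
                 rng.normal(0.7, 0.10, (32, 64))])

pixels = img.reshape(-1, 1)                                         # one feature per pixel
gm = GaussianMixture(n_components=2, random_state=0).fit(pixels)    # EM fit
labels = gm.predict(pixels).reshape(img.shape)                      # max posterior = Bayes rule
print("segment sizes:", np.bincount(labels.ravel()))
```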
Negative Selection Based Data Classification with Flexible Boundaries
One of the most important artificial immune algorithms is the negative selection algorithm, an anomaly detection and pattern recognition technique; however, recent research has shown the successful application of this algorithm to data classification. Most negative selection methods consider deterministic boundaries to distinguish between self and non-self spaces. In this paper, two...
Estimating Gaussian Mixture Models from Data with Missing Features
Maximum likelihood (ML) fitting of Gaussian mixture models (GMMs) to feature data is most efficiently handled by the EM algorithm [1, 2, 3, 4]. The EM algorithm is directly applicable to multivariate data in which all the features are always present, and there are no missing values. Unfortunately, missing values are common, caused either by random or systematic effects. This study presents a novel a...
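The truncated abstract does not describe the study's novel approach, so the sketch below shows only a generic baseline under assumed diagonal covariances: E-step responsibilities computed from the marginal Gaussian likelihood of each sample's observed coordinates, with missing entries (NaN) simply left out of the sum. The function and the toy data are illustrative assumptions.

```python
# Generic baseline only (the study's novel approach is not described above):
# E-step responsibilities for a diagonal-covariance GMM with missing features.
import numpy as np
from scipy.stats import norm

def responsibilities(X, weights, means, stds):
    """X: (n, d) array with NaN marking missing entries.
    weights: (k,), means: (k, d), stds: (k, d).  Returns (n, k) posteriors."""
    observed = ~np.isnan(X)
    log_r = np.empty((X.shape[0], len(weights)))
    for j in range(len(weights)):
        # log N(x | mu_j, diag(sigma_j^2)), summed over observed coordinates only
        logpdf = norm.logpdf(np.nan_to_num(X), means[j], stds[j])
        log_r[:, j] = np.log(weights[j]) + np.where(observed, logpdf, 0.0).sum(axis=1)
    log_r -= log_r.max(axis=1, keepdims=True)   # stabilize before exponentiating
    r = np.exp(log_r)
    return r / r.sum(axis=1, keepdims=True)

# tiny usage example: the second sample has a missing first coordinate
X = np.array([[0.1, 0.2], [np.nan, 3.0]])
w = np.array([0.5, 0.5])
mu = np.array([[0.0, 0.0], [0.0, 3.0]])
sd = np.ones((2, 2))
print(responsibilities(X, w, mu, sd).round(3))
```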